65,180 research outputs found

    Autonomic computing architecture for SCADA cyber security

    Cognitive computing refers to intelligent computing platforms based on the disciplines of artificial intelligence, machine learning, and other innovative technologies. These technologies can be used to design systems that mimic the human brain, learning about their environment and autonomously predicting an impending anomalous situation. IBM first used the term ‘Autonomic Computing’ in 2001 to combat the looming complexity crisis (Ganek and Corbi, 2003). The concept is inspired by the human biological autonomic system. An autonomic system is self-healing, self-regulating, self-optimising and self-protecting (Ganek and Corbi, 2003); such a system should therefore be able to protect itself against both malicious attacks and unintended operator mistakes.
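    The self-* properties listed above are usually organised, in IBM's autonomic computing literature, as a monitor-analyse-plan-execute (MAPE) loop over a shared knowledge base. The sketch below is a minimal, hypothetical illustration of such a loop watching a single SCADA-style metric; the class name, threshold and corrective action are assumptions made for illustration, not the architecture the paper proposes.

```python
# Minimal sketch of a MAPE-style autonomic control loop (illustrative only).
# The metric, threshold and action names are hypothetical, not taken from the paper.

from dataclasses import dataclass, field
from statistics import mean


@dataclass
class AutonomicManager:
    """Monitor-Analyse-Plan-Execute loop over a shared knowledge base."""
    knowledge: list = field(default_factory=list)   # recent sensor readings
    window: int = 10                                # history used for analysis
    threshold: float = 3.0                          # deviation counted as anomalous

    def monitor(self, reading: float) -> None:
        self.knowledge.append(reading)
        self.knowledge = self.knowledge[-self.window:]

    def analyse(self) -> bool:
        if len(self.knowledge) < self.window:
            return False
        baseline = mean(self.knowledge[:-1])
        return abs(self.knowledge[-1] - baseline) > self.threshold

    def plan(self) -> str:
        # A real system would choose among self-healing / self-protecting actions.
        return "isolate_plc_segment"

    def execute(self, action: str) -> None:
        print(f"executing corrective action: {action}")

    def step(self, reading: float) -> None:
        self.monitor(reading)
        if self.analyse():
            self.execute(self.plan())


mgr = AutonomicManager()
for value in [5.0] * 10 + [12.0]:   # steady readings, then a sudden jump
    mgr.step(value)
```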

    Autonomic computing meets SCADA security

    © 2017 IEEE. National assets such as transportation networks, large manufacturing, business and health facilities, power generation, and distribution networks are critical infrastructures. The cyber threats to these infrastructures have become increasingly sophisticated, extensive and numerous. Conventional cyber security measures have proved useful in the past, but the growing sophistication of attacks dictates the need for newer measures. The autonomic computing paradigm mimics the autonomic nervous system and is a promising way to meet the latest challenges in the cyber threat landscape. This paper provides a brief review of autonomic computing applications for SCADA systems and proposes an architecture for cyber security.

    Efficient Energy Transport in Photosynthesis: Roles of Coherence and Entanglement

    Recently it has been discovered---contrary to expectations of physicists as well as biologists---that the energy transport during photosynthesis, from the chlorophyll pigment that captures the photon to the reaction centre where glucose is synthesised from carbon dioxide and water, is highly coherent even at ambient temperature and in the cellular environment. This process and the key molecular ingredients that it depends on are described. By looking at the process from the computer science view-point, we can study what has been optimised and how. A spatial search algorithmic model based on robust features of wave dynamics is presented.
    Comment: 6 pages, 3 figures, to appear in the proceedings of the Symposium "75 Years of Quantum Entanglement: Foundations and Information Theoretic Applications", January 2011, Kolkata, India
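    The abstract does not spell out the spatial search model; purely as orientation, the sketch below implements the textbook continuous-time quantum-walk search on a complete graph, in which a walk Hamiltonian plus an oracle term rotates the uniform initial state onto a marked site in a time of order the square root of the number of sites. The graph, site count and timing are illustrative assumptions, not the paper's specific construction.

```python
# Continuous-time quantum-walk spatial search on a complete graph.
# Textbook illustration only; not the specific model developed in the paper.

import numpy as np
from scipy.linalg import expm

N = 16                              # number of sites
marked = 3                          # index of the marked ("target") site
A = np.ones((N, N)) - np.eye(N)     # adjacency matrix of the complete graph

gamma = 1.0 / N                     # hopping rate (optimal choice for this graph)
H = -gamma * A                      # walk Hamiltonian ...
H[marked, marked] -= 1.0            # ... plus the oracle term -|marked><marked|

psi0 = np.full(N, 1.0 / np.sqrt(N))     # uniform superposition over all sites
t = 0.5 * np.pi * np.sqrt(N)            # time at which success probability peaks
psi_t = expm(-1j * H * t) @ psi0

print(f"P(marked) at t = {t:.2f}: {abs(psi_t[marked])**2:.3f}")
```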

    Towards Understanding the Origin of Genetic Languages

    Molecular biology is a nanotechnology that works--it has worked for billions of years and in an amazing variety of circumstances. At its core is a system for acquiring, processing and communicating information that is universal, from viruses and bacteria to human beings. Advances in genetics and experience in designing computers have taken us to a stage where we can understand the optimisation principles at the root of this system, from the availability of basic building blocks to the execution of tasks. The languages of DNA and proteins are argued to be the optimal solutions to the information processing tasks they carry out. The analysis also suggests simpler predecessors to these languages, and provides fascinating clues about their origin. Obviously, a comprehensive unraveling of the puzzle of life would have a lot to say about what we may design or convert ourselves into.
    Comment: (v1) 33 pages, contributed chapter to "Quantum Aspects of Life", edited by D. Abbott, P. Davies and A. Pati, (v2) published version with some editing

    Surface-renewal models for heat-transfer between walls and fluidized beds

    Two surface-renewal film-penetration models describe transient heat-transfer between a wall and a fluidized bed. Methods are presented for estimating the mean residence times of particles at the transporting surface, their age densities, and the average transport coefficients.
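    As background to the averaging referred to above, the classical surface-renewal expressions follow from integrating the transient-conduction coefficient over an assumed age density of the material at the wall; the Higbie and Danckwerts forms below are standard results quoted for orientation, not the film-penetration results derived in the paper.

```latex
% Classical surface-renewal background (not the paper's film-penetration
% results). k, rho, c_p refer to the emulsion phase contacting the wall;
% \psi(t) is the age density of surface elements (particles) at the wall.
\begin{align*}
  h(t) &= \sqrt{\frac{k\,\rho\,c_p}{\pi t}}
    && \text{transient conduction into an element of age } t \\
  \bar{h} &= \int_0^\infty h(t)\,\psi(t)\,\mathrm{d}t
    && \text{mean coefficient for age density } \psi(t) \\
  \bar{h}_{\text{Higbie}} &= 2\sqrt{\frac{k\,\rho\,c_p}{\pi\,\tau}}
    && \psi(t) = 1/\tau \ \ (0 \le t \le \tau) \\
  \bar{h}_{\text{Danckwerts}} &= \sqrt{k\,\rho\,c_p\,s}
    && \psi(t) = s\,e^{-st}
\end{align*}
```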

    Narrative approaches to design multi-screen augmented reality experiences

    This paper explores how traditional narrative language used in film and theatre can be adapted to create interactivity and a greater sense of presence in the virtual heritage environment. It focuses on the fundamental principles of narrative required to create immersion and presence and investigates methods of embedding intangible social histories into these environments. These issues are explored in a case study of Greens Mill in the 1830s, interweaving the story of the Reform Bill riots in Nottingham with the life of George Green, mathematician and proprietor of the Mill.

    Narrating the past: virtual environments and narrative

    This paper explores how traditional narrative language used in film and theatre can be adapted to create interactivity and a greater sense of presence in the virtual heritage environment. It focuses on the fundamental principles of narrative required to create immersion and presence and investigates methods of embedding intangible social histories into these environments. These issues are explored in a case study of Greens Mill in the 1830s, interweaving the story of the Reform Bill riots in Nottingham with the life of George Green, mathematician and proprietor of the Mill.

    A Transverse Lattice QCD Model for Mesons

    QCD is analysed with two light-front continuum dimensions and two transverse lattice dimensions. In the limit of a large number of colours and strong transverse gauge coupling, the contributions of the light-front and transverse directions factorise in the dynamics, and the theory can be solved analytically in closed form. An integral equation describing the properties of mesons is obtained, which generalises the 't Hooft equation by including spin degrees of freedom. The meson spectrum, light-front wavefunctions and form factors can be obtained by solving this equation numerically. These results would be a good starting point for modelling QCD observables that depend only weakly on the transverse directions, e.g. deep inelastic scattering structure functions.
    Comment: Lattice 2003 (theory), 3 pages
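    For reference, the spinless equation being generalised, the 't Hooft equation for large-N QCD in two dimensions, has the standard principal-value form below (conventions for the mass terms vary between authors); the paper's spin-dependent generalisation is not reproduced in the abstract.

```latex
% Standard 't Hooft equation (large-N QCD in 1+1 dimensions, no spin):
% \phi(x) is the meson light-front wavefunction, x the momentum fraction,
% m_1, m_2 the quark masses, g^2 N the 't Hooft coupling, P the principal value.
% Quoted as background only.
\mu^2\,\phi(x) =
  \left( \frac{m_1^2}{x} + \frac{m_2^2}{1-x} \right) \phi(x)
  - \frac{g^2 N}{\pi}\,\mathrm{P}\!\int_0^1 \mathrm{d}y\;\frac{\phi(y)}{(x-y)^2}
```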

    Are Galaxies Optically Thin to Their Own Lyman Continuum Radiation? II. NGC 6822

    Halpha and UBV photometry of NGC 6822 are used to study the distribution of OB stars and H II regions in the galaxy and to determine whether individual regions of the galaxy are in a state of ionization balance. Four distinct components of the Halpha emission (bright, halo, diffuse and field), differentiated by their surface brightnesses, are identified. We find that approximately 1/2 of all OB stars in NGC 6822 are located in the field while only 1/4 are found in the combined bright and halo regions, suggesting that OB stars spend roughly 3/4 of their lifetimes outside "classical" H II regions. Comparing the observed Halpha emission with that predicted from stellar ionizing flux models, we find that although the bright, halo and diffuse regions are probably in ionization balance, the field region is producing at least 6 times as much ionizing flux as is observed. The ionization balance results in NGC 6822 suggest that star formation rates obtained from Halpha luminosities must underestimate the true star formation rate in this galaxy by about 50%. Comparing our results for NGC 6822 with previous results for the spiral galaxy M33, we find that the inner kiloparsec of M33 is in a more serious state of ionization imbalance, perhaps due to its higher surface density of blue stars.
    Comment: Replaced version should now compile with standard aastex style files. 28 pages, aastex preprint format. Accepted in ApJ. Hardcopies of figures available on request to [email protected]
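    The roughly 50% figure can be read against the standard linear conversion from Halpha luminosity to star formation rate; the widely used Kennicutt (1998) calibration is quoted below for orientation (the paper may adopt a different zero point), and because the relation is linear, any ionizing flux not recovered as observed Halpha lowers the inferred SFR by the same factor.

```latex
% Widely used linear Halpha-to-SFR calibration (Kennicutt 1998), quoted as
% background only; the paper may adopt a different zero point.
\mathrm{SFR}\,[M_\odot\,\mathrm{yr}^{-1}]
  \approx 7.9 \times 10^{-42}\; L_{\mathrm{H}\alpha}\,[\mathrm{erg\,s^{-1}}]
% Ionizing photons that do not end up as observed Halpha emission therefore
% reduce the inferred SFR by the same fraction as the flux deficit.
```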